Modelling Word Meaning using Efficient Tensor Representations

Authors

  • Mike Symonds
  • Peter Bruza
  • Laurianne Sitbon
  • Ian Turner
Abstract

Models of word meaning, built from a corpus of text, have demonstrated success in emulating human performance on a number of cognitive tasks. Many of these models use geometric representations of words to store semantic associations between words. Often word order information is not captured in these models. The lack of structural information used by these models has been raised as a weakness when performing cognitive tasks. This paper presents an efficient tensor based approach to modelling word meaning that builds on recent attempts to encode word order information, while providing flexible methods for extracting task specific semantic information.
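The paper's own formulation is not reproduced in this abstract; as an illustrative sketch only, the following shows one common way a tensor (outer) product can bind word vectors so that word order is preserved, something a plain bag-of-words vector model cannot do. The vocabulary, dimensionality, and random index vectors here are assumptions for the demonstration, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 64  # illustrative vector dimensionality

# Hypothetical random index vectors for a tiny vocabulary
vocab = ["the", "cat", "sat"]
vectors = {w: rng.standard_normal(dim) / np.sqrt(dim) for w in vocab}

def bind(left, right):
    """Tensor (outer) product of two word vectors.

    The result is an order-sensitive matrix: bind(a, b) != bind(b, a),
    so left and right neighbours remain distinguishable in memory.
    """
    return np.outer(left, right)

# Memory trace for "cat": accumulate bindings with its neighbours.
memory = bind(vectors["the"], vectors["cat"]) + bind(vectors["cat"], vectors["sat"])

# Probe: does "the" occur to the LEFT of "cat"? Compare the probe
# binding against the stored memory with a matrix inner product.
score_left = np.sum(bind(vectors["the"], vectors["cat"]) * memory)
score_wrong = np.sum(bind(vectors["cat"], vectors["the"]) * memory)
print(score_left > score_wrong)
```

Because the outer product is not commutative, the probe in the correct order matches the stored trace far better than the reversed probe, which is the sense in which such tensor representations retain structural (word-order) information.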


Similar resources

On tensor product $L$-functions and Langlands functoriality

In the spirit of the Langlands proposal on Beyond Endoscopy, we discuss the explicit relation between the Langlands functorial transfers and automorphic $L$-functions. It is well known that the poles of the $L$-functions have a deep impact on the Langlands functoriality. Our discussion also includes the meaning of the central value of the tensor product $L$-functions in terms of the Langl...

Full text

Unsupervised Learning of Word-Sequence Representations from Scratch via Convolutional Tensor Decomposition

Unsupervised extraction of text embeddings is crucial for text understanding in machine learning. Word2Vec and its variants have had substantial success in mapping words with similar syntactic or semantic meaning to vectors close to each other. However, extracting context-aware word-sequence embeddings remains a challenging task. Training over a large corpus is difficult as labels are difficult ...

Full text

Irreducibility of the tensor product of Albeverio's representations of the Braid groups $B_3$ and $B_4$

We consider Albeverio's linear representations of the braid groups $B_3$ and $B_4$. We specialize the indeterminates used in defining these representations to nonzero complex numbers. We then consider the tensor products of the representations of $B_3$ and the tensor products of those of $B_4$. We then determine necessary and sufficient conditions that guarantee the irreducibility of th...

Full text

Fuzzy paraphrases in learning word representations with a lexicon

We identify a pitfall that is not carefully addressed in previous works that use lexicons or ontologies to train or improve distributed word representations: for polysemous words and utterances whose meaning changes across contexts, their paraphrases or related entities in a lexicon or an ontology are unreliable and can sometimes degrade the learning of word representations. Thus, we propose a...

Full text

Learning Distributed Word Representations for Natural Logic Reasoning

Natural logic offers a powerful relational conception of meaning that is a natural counterpart to distributed semantic representations, which have proven valuable in a wide range of sophisticated language tasks. However, it remains an open question whether it is possible to train distributed representations to support the rich, diverse logical reasoning captured by natural logic. We address thi...

Full text



Journal title:

Volume   Issue

Pages  -

Publication date: 2011